Non-Asymptotic Analysis of Block-Regularized Regression Problem
Authors
Abstract
Given a linear multivariate regression problem with block sparsity structure on the regression matrix, one popular approach for estimating its unknown parameter is block-regularization, where the sparsity of different blocks of the regression matrix is promoted by penalizing their ℓ∞-norms. The main goal of this work is to characterize the properties of this estimator under high-dimensional scaling, where the dimension of the problem grows at a rate comparable to, or even faster than, that of the sample size. In particular, this work generalizes the existing non-asymptotic results on special instances of block-regularized estimators to the case where the unknown regression matrix has an arbitrary number of blocks, each with a potentially different size. When the design matrix is deterministic, a sharp non-asymptotic rate is derived on the element-wise error of the proposed estimator. Furthermore, it is proven that approximately the same error rate holds for the block-regularized estimator when the design matrix is randomly generated, provided that the number of samples exceeds a lower bound. The accuracy of the proposed estimator is illustrated on several test cases.
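For concreteness, the following is a minimal sketch of such a block-regularized estimator; the use of cvxpy as the solver, the three-block row partition, the synthetic data, and the choice of the regularization parameter are illustrative assumptions rather than the paper's prescription.

```python
# A minimal sketch of the block-regularized estimator (assumed setup:
# cvxpy as the solver, a 3-block row partition, and an illustrative
# choice of the regularization parameter lam).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p, m = 100, 40, 5                        # samples, features, responses
blocks = [(0, 10), (10, 25), (25, 40)]      # row-blocks of different sizes

B_true = np.zeros((p, m))
B_true[0:10, :] = rng.normal(size=(10, m))  # only the first block is active
X = rng.normal(size=(n, p))
Y = X @ B_true + 0.1 * rng.normal(size=(n, m))

lam = 2.0 * np.sqrt(np.log(p * m) / n)      # illustrative tuning, not the paper's
B = cp.Variable((p, m))
# l_infinity norm of a block = largest absolute entry of that block
penalty = sum(cp.max(cp.abs(B[a:b, :])) for (a, b) in blocks)
cp.Problem(cp.Minimize(cp.sum_squares(Y - X @ B) / (2 * n) + lam * penalty)).solve()

print("element-wise error:", np.max(np.abs(B.value - B_true)))
```

Because the ℓ∞ penalty charges only the largest entry of each block, the remaining entries of an active block can be nonzero at no extra cost, which is the feature that separates this regularizer from the ℓ1/ℓ2 group penalty.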
Similar Papers
High-Dimensional Structured Quantile Regression
Quantile regression aims at modeling the conditional median and quantiles of a response variable given certain predictor variables. In this work we consider the problem of linear quantile regression in high dimensions where the number of predictor variables is much higher than the number of samples available for parameter estimation. We assume the true parameter to have some structure character...
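As an illustration of the simplest such structure, the sketch below fits an ℓ1-penalized quantile regression by minimizing the pinball (check) loss; the synthetic data, the quantile level, and the value of the penalty parameter are assumptions made for the example.

```python
# A minimal sketch of l1-penalized quantile regression via the pinball
# loss; the data, quantile level tau, and lam are assumed for the example.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n, p, tau = 50, 200, 0.5                # p >> n: high-dimensional regime
beta_true = np.zeros(p)
beta_true[:5] = 1.0                     # 5-sparse true parameter
X = rng.normal(size=(n, p))
y = X @ beta_true + rng.standard_t(df=3, size=n)   # heavy-tailed noise

beta = cp.Variable(p)
u = y - X @ beta
pinball = cp.sum(cp.maximum(tau * u, (tau - 1) * u)) / n   # check loss
lam = 0.5 * np.sqrt(np.log(p) / n)      # illustrative tuning
cp.Problem(cp.Minimize(pinball + lam * cp.norm1(beta))).solve()

print("estimated support:", np.flatnonzero(np.abs(beta.value) > 1e-3))
```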
Super-Linear Convergence of Dual Augmented Lagrangian Algorithm for Sparsity Regularized Estimation
We analyze the convergence behaviour of a recently proposed algorithm for regularized estimation called Dual Augmented Lagrangian (DAL). Our analysis is based on a new interpretation of DAL as a proximal minimization algorithm. We theoretically show under some conditions that DAL converges super-linearly in a non-asymptotic and global sense. Due to a special modelling of sparse estimation probl...
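DAL itself applies an augmented Lagrangian to the dual problem; as a point of reference for the proximal-minimization view used in that analysis, the sketch below runs a plain proximal-gradient (ISTA) loop built on the same soft-thresholding proximal operator. It illustrates the proximal machinery, not the DAL algorithm itself, on assumed toy data.

```python
# A plain proximal-gradient (ISTA) loop for l1-regularized least squares,
# built on the soft-thresholding proximal operator; NOT the DAL algorithm,
# just a sketch of the proximal machinery under assumed toy data.
import numpy as np

def soft_threshold(v, t):
    # prox of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(2)
n, p, lam = 80, 120, 0.1
A = rng.normal(size=(n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[:4] = [3.0, -2.0, 1.5, -1.0]
b = A @ x_true + 0.01 * rng.normal(size=n)

step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L, L = Lipschitz const of the gradient
x = np.zeros(p)
for _ in range(500):
    grad = A.T @ (A @ x - b)
    x = soft_threshold(x - step * grad, step * lam)

print("nonzeros found:", np.flatnonzero(np.abs(x) > 1e-4))
```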
A coordinate gradient descent method for ℓ1-regularized convex minimization
In applications such as signal processing and statistics, many problems involve finding sparse solutions to under-determined linear systems of equations. These problems can be formulated as structured nonsmooth optimization problems, i.e., the problem of minimizing ℓ1-regularized linear least squares. In this paper, we propose a block coordinate gradient descent method (abbreviated a...
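The sketch below illustrates the idea with cyclic coordinate descent on the ℓ1-regularized least-squares problem, where each univariate subproblem has a closed-form soft-thresholding solution; the paper's method is a block coordinate gradient variant, so this scalar-coordinate version with assumed data is a simplification.

```python
# A minimal sketch of cyclic coordinate descent for
# min_x 0.5 * ||A x - b||^2 + lam * ||x||_1, with assumed toy data; each
# univariate subproblem is solved in closed form by soft-thresholding.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * max(abs(v) - t, 0.0)

rng = np.random.default_rng(3)
n, p, lam = 60, 100, 3.0
A = rng.normal(size=(n, p))
x_true = np.zeros(p)
x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true + 0.01 * rng.normal(size=n)

col_sq = (A ** 2).sum(axis=0)           # per-coordinate curvature a_j^T a_j
x = np.zeros(p)
r = b - A @ x                           # residual, maintained incrementally
for _ in range(50):                     # full sweeps over the coordinates
    for j in range(p):
        rho = A[:, j] @ r + col_sq[j] * x[j]      # partial correlation
        x_new = soft_threshold(rho, lam) / col_sq[j]
        r += A[:, j] * (x[j] - x_new)   # O(n) residual update
        x[j] = x_new

print("nonzeros found:", np.flatnonzero(np.abs(x) > 1e-6))
```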
A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
This paper considers regularized block multi-convex optimization, where the feasible set and objective function are generally non-convex but convex in each block of variables. We review some of its interesting examples and propose a generalized block coordinate descent method. (Using proximal updates, we further allow non-convexity over some blocks.) Under certain conditions, we show that any l...
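Nonnegative matrix factorization is a canonical instance of such a multiconvex problem: jointly non-convex in (W, H) but convex in each block separately. The sketch below alternates a single projected-gradient step per block, in the spirit of the proximal block updates mentioned above; the data and step sizes are illustrative assumptions.

```python
# A minimal sketch of block coordinate descent on a multiconvex problem:
# nonnegative matrix factorization min_{W,H >= 0} ||M - W H||_F^2, which
# is non-convex jointly but convex in each block (W or H) separately.
# Each block update here is one projected-gradient step; data are assumed.
import numpy as np

rng = np.random.default_rng(4)
m, n, r = 30, 20, 4
M = np.abs(rng.normal(size=(m, r))) @ np.abs(rng.normal(size=(r, n)))

W = np.abs(rng.normal(size=(m, r)))
H = np.abs(rng.normal(size=(r, n)))
for _ in range(500):
    # block 1: update W with H fixed (convex subproblem)
    step_w = 1.0 / (np.linalg.norm(H @ H.T, 2) + 1e-12)
    W = np.maximum(W - step_w * (W @ H - M) @ H.T, 0.0)
    # block 2: update H with W fixed (convex subproblem)
    step_h = 1.0 / (np.linalg.norm(W.T @ W, 2) + 1e-12)
    H = np.maximum(H - step_h * W.T @ (W @ H - M), 0.0)

print("relative error:", np.linalg.norm(M - W @ H) / np.linalg.norm(M))
```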
On the Minimax Optimality of Block Thresholded Wavelets Estimators for ?-Mixing Process
We propose a wavelet-based regression function estimator for the estimation of the regression function for a sequence of ?-mixing random variables with a common one-dimensional probability density function. Some asymptotic properties of the proposed estimator based on block thresholding are investigated. It is found that the estimators achieve optimal minimax convergence rates over large class...
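The sketch below illustrates the block-thresholding device on wavelet detail coefficients: coefficients are grouped into blocks of length on the order of log n, and each block is kept or shrunk according to its total energy (a James-Stein style rule). The signal, noise level, and threshold constant are assumptions made for the example, and pywt supplies the wavelet transform.

```python
# A minimal sketch of block thresholding on wavelet detail coefficients:
# blocks of length ~ log n are kept or shrunk by a James-Stein style rule.
# The signal, noise level sigma, and threshold constant are assumptions;
# pywt provides the discrete wavelet transform.
import numpy as np
import pywt

rng = np.random.default_rng(5)
n, sigma = 1024, 0.3
t = np.linspace(0, 1, n)
y = np.sin(8 * np.pi * t) * (t > 0.3) + sigma * rng.normal(size=n)

coeffs = pywt.wavedec(y, "db4", level=5)   # [approx, detail_5, ..., detail_1]
L = int(np.log(n))                         # block length on the order of log n
lam = 4.505                                # a constant of this order appears in the literature (assumption)

for d in coeffs[1:]:                       # threshold the detail levels only
    for start in range(0, len(d), L):
        blk = d[start:start + L]
        energy = float(np.sum(blk ** 2))
        # keep the block if its energy is large, otherwise shrink it toward 0
        shrink = max(1.0 - lam * sigma ** 2 * len(blk) / max(energy, 1e-12), 0.0)
        d[start:start + L] = shrink * blk

y_hat = pywt.waverec(coeffs, "db4")[:n]    # block-thresholded estimate
```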